Guided tutorials for IBM Event Automation

This set of guided tutorials will take you through the key features of Event Automation.

Set up the environment

Quickly create a demo Event Automation environment, including topics with live streams of events that you can use to try the tutorials.

1 - Filter events based on particular properties

When processing events, you can use filter operations to select only the input events that have the properties you are interested in.
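
In Event Processing you add a filter node to your flow in the canvas rather than writing any code. Purely as an illustrative sketch of the underlying idea, here is a hypothetical Kafka Streams application that keeps only the order events for one region; the topic names, bootstrap address, and JSON property are assumptions made up for this example, and security configuration is omitted.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FilterOrdersApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-orders-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-bootstrap:9092"); // assumption
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read the stream of order events (JSON strings) from an input topic.
        KStream<String, String> orders = builder.stream("ORDERS");

        // Filter operation: keep only the events whose "region" property matches.
        orders.filter((key, value) -> value.contains("\"region\":\"EMEA\""))
              .to("ORDERS.EMEA");

        new KafkaStreams(builder.build(), props).start();
    }
}
```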

2 - Transform events to create or remove properties

When processing events, you can use transform operations to refine input events, for example by creating new properties or removing ones that are not needed.
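
As with the filter sketch above, this is only a hypothetical code illustration of the concept; in Event Processing you configure a transform node instead. The fragment below (which would be wired into an application like the earlier one) uses Jackson to add a derived property and remove one that downstream consumers do not need; the property names are assumptions.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

public class TransformOrdersTopology {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    static void build(StreamsBuilder builder) {
        KStream<String, String> orders = builder.stream("ORDERS");

        KStream<String, String> transformed = orders.mapValues(value -> {
            try {
                ObjectNode event = (ObjectNode) MAPPER.readTree(value);
                // Create a new property derived from existing ones...
                event.put("totalprice",
                        event.get("unitprice").asDouble() * event.get("quantity").asInt());
                // ...and remove a property that downstream consumers do not need.
                event.remove("customeremail");
                return MAPPER.writeValueAsString(event);
            } catch (Exception e) {
                return value; // pass events that are not valid JSON through unchanged
            }
        });

        transformed.to("ORDERS.TRANSFORMED");
    }
}
```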

3 - Aggregate events to detect trends over time

Processing events over a time window allows you to build a summary view of a situation, which can be useful for identifying overall trends.
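
A rough code analogue of a windowed aggregation, again as a hypothetical Kafka Streams fragment rather than how you build it in the Event Processing canvas: count the order events per key in ten-minute windows and emit one summary event per key per window. The topic names, window size, and JSON layout are assumptions.

```java
import java.time.Duration;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class OrderCountsTopology {

    static void build(StreamsBuilder builder) {
        // Order events, keyed by region (the key choice is an assumption for this sketch).
        KStream<String, String> orders = builder.stream("ORDERS");

        // Count how many orders each region produced in every ten-minute window,
        // then emit one summary event per region per window.
        KStream<String, String> summaries = orders
                .groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(10)))
                .count()
                .toStream()
                .map((windowedRegion, count) -> KeyValue.pair(
                        windowedRegion.key(),
                        "{\"region\":\"" + windowedRegion.key() + "\",\"orders\":" + count + "}"));

        summaries.to("ORDER.COUNTS");
    }
}
```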

4 - Join related events within time windows

Many interesting situations require combining multiple streams of events, correlating events across those inputs to derive a new, more meaningful view of the situation.
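
As a hypothetical sketch of the idea (not how you build it in Event Processing), the fragment below joins two streams that share a key, emitting a combined event only when matching events arrive within five minutes of each other. The topics, key choice, and window size are assumptions.

```java
import java.time.Duration;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;

public class OrderShipmentJoinTopology {

    static void build(StreamsBuilder builder) {
        // Two input streams, both keyed by order ID (an assumption for this sketch).
        KStream<String, String> orders = builder.stream("ORDERS");
        KStream<String, String> shipments = builder.stream("SHIPMENTS");

        // Correlate an order with its shipment when both arrive within five minutes of each other.
        KStream<String, String> correlated = orders.join(
                shipments,
                (order, shipment) -> "{\"order\":" + order + ",\"shipment\":" + shipment + "}",
                JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofMinutes(5)));

        correlated.to("ORDERS.SHIPPED");
    }
}
```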

5 - Automate actions based on event triggers

Once your event processing has identified a situation of interest, a common next step is to automate a response. You can write the output of your processing flows to a Kafka topic to achieve this.
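
The processing flow itself writes its results to the output topic; the hypothetical sketch below shows how a downstream application might subscribe to that topic and trigger a response. The topic name, group ID, and bootstrap address are assumptions, the actual response is left as a placeholder, and security configuration is omitted.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderAlertConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-bootstrap:9092"); // assumption
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-alerts");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribe to the topic that the processing flow writes its results to.
            consumer.subscribe(List.of("ORDERS.SHIPPED"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    // Trigger whatever automated response makes sense for the situation,
                    // for example calling a webhook or opening a ticket (placeholder here).
                    System.out.println("situation detected: " + record.value());
                }
            }
        }
    }
}
```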

6 - Share events for discovery by others

Publish the results of your event processing to the Catalog to allow them to be shared and reused by others.

Tutorials for Event Processing

Learn about how Event Processing can help you transform and act on event data.

Enrich events with reference data

Enriching a stream of events with static reference data from a database can increase the business relevance of events.
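
The tutorial enriches events with reference data from a database; as a loose code analogue only, the hypothetical fragment below joins the stream against reference data held in a compacted Kafka topic instead. Topic names and the shared key are assumptions.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class EnrichOrdersTopology {

    static void build(StreamsBuilder builder) {
        // Order events keyed by product ID (an assumption for this sketch).
        KStream<String, String> orders = builder.stream("ORDERS");

        // Reference data held in a compacted topic, also keyed by product ID.
        KTable<String, String> products = builder.table("PRODUCT.REFERENCE");

        // Enrich each order event with the matching reference record.
        KStream<String, String> enriched = orders.join(products,
                (order, product) -> "{\"order\":" + order + ",\"product\":" + product + "}");

        enriched.to("ORDERS.ENRICHED");
    }
}
```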

Generating custom properties for custom filters

Writing custom filter expressions that use dynamically generated properties added to events can help you identify specific situations.

Identify combinations of events occurring multiple times

Aggregate processors can identify combinations of events that only become interesting when they occur multiple times within a time window.
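
As a hypothetical illustration of the idea, the fragment below counts events per key in a thirty-minute window and only emits a result once a combination has been seen at least three times. The topics, key, window size, and threshold are all assumptions.

```java
import java.time.Duration;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class RepeatedOrdersTopology {

    static void build(StreamsBuilder builder) {
        // Events keyed by the combination of interest, e.g. "customer|product" (assumption).
        KStream<String, String> orders = builder.stream("ORDERS");

        KStream<String, String> repeated = orders
                .groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(30)))
                .count()
                .toStream()
                // Only combinations seen at least three times in the window are interesting.
                .filter((windowedKey, count) -> count >= 3)
                .map((windowedKey, count) -> KeyValue.pair(
                        windowedKey.key(), "seen " + count + " times in 30 minutes"));

        repeated.to("SUSPICIOUS.ORDERS");
    }
}
```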

Deduplicating repeated events

Some systems offer at-least-once delivery assurances and will occasionally generate duplicate events. Removing these duplicates enables the stream to be consumed by systems that cannot process events idempotently.

Identify trends from events

Identifying types of events that occur most frequently within a time window is a useful way to detect trends.

Process out-of-sequence events

Events generated by a wide range of producers can arrive out of sequence on a topic, so it is important to resolve this before performing time-sensitive processing.
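
Event Processing handles this through the event-time settings of the flow; as a loose code analogue, the hypothetical fragment below gives a windowed aggregation a grace period so that late, out-of-order events are still counted in the window they belong to. The topic names and durations are assumptions.

```java
import java.time.Duration;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class LateEventsTopology {

    static void build(StreamsBuilder builder) {
        KStream<String, String> readings = builder.stream("SENSOR.READINGS");

        // A ten-minute window with a two-minute grace period: events that arrive late
        // or out of order are still counted in the window they logically belong to.
        KStream<String, String> counts = readings
                .groupByKey()
                .windowedBy(TimeWindows.ofSizeAndGrace(Duration.ofMinutes(10), Duration.ofMinutes(2)))
                .count()
                .toStream()
                .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count.toString()));

        counts.to("SENSOR.COUNTS");
    }
}
```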

Triggering automations from processing results

You can use App Connect to trigger automations and notifications from event processing results.

Processing IBM MQ messages

IBM MQ queues and topics are a valuable source of events for processing.

Using complex events with nested properties and arrays

Handle complex events with nested properties and arrays to identify a situation and extract only the information you need.

Monitoring Flink with Prometheus and Grafana

Find out how to monitor Flink with Prometheus and set up Grafana.

Handle evolving formats by using a schema registry

Connect to a schema registry to process a stream of events with formats that change over time.

Unpacking arrays from events to process array contents easily

Unpack each array element into separate events or into new properties in the same event to process the array content.
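
As a hypothetical illustration of the idea, the fragment below uses Jackson to split an assumed "items" array into one event per element; the topic and property names are assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

public class UnpackArrayTopology {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    static void build(StreamsBuilder builder) {
        KStream<String, String> orders = builder.stream("ORDERS");

        // Turn one event containing an "items" array into one event per array element.
        KStream<String, String> items = orders.flatMapValues(value -> {
            List<String> elements = new ArrayList<>();
            try {
                JsonNode event = MAPPER.readTree(value);
                for (JsonNode item : event.get("items")) {
                    elements.add(item.toString());
                }
            } catch (Exception e) {
                // Ignore events that are not valid JSON in this sketch.
            }
            return elements;
        });

        items.to("ORDER.ITEMS");
    }
}
```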

Detection of key and headers in a Kafka topic message

Detect the key and headers of your Kafka topic messages and define them as properties in the event source.
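
Event Processing detects keys and headers for you when you configure the event source node. If you want to inspect what your messages actually carry, a plain Java consumer such as the hypothetical sketch below can print the key and headers of each message; the topic name, group ID, and bootstrap address are assumptions, and security configuration is omitted.

```java
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.header.Header;
import org.apache.kafka.common.serialization.StringDeserializer;

public class InspectKeyAndHeaders {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-bootstrap:9092"); // assumption
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "header-inspector");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("ORDERS"));
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(10))) {
                // The record key and each header are available alongside the message value.
                System.out.println("key = " + record.key());
                for (Header header : record.headers()) {
                    System.out.println(header.key() + " = "
                            + new String(header.value(), StandardCharsets.UTF_8));
                }
            }
        }
    }
}
```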

Tutorials for Event Streams

Find out how to get more out of Event Streams with these practical tutorials.

Running Kafka Streams applications

Learn how to run Kafka Streams applications in Event Streams.
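
The tutorial covers the full setup; as a minimal sketch of the kind of application involved, the hypothetical example below reads from one topic, transforms each event, and writes to another. The topic names and bootstrap address are assumptions, and the connection security settings that Event Streams requires are omitted.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

public class SimpleStreamsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "simple-streams-demo");
        // Use the bootstrap address of your Event Streams cluster; security settings
        // (SASL credentials, truststore) are omitted from this sketch.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-bootstrap:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Copy events from one topic to another, upper-casing the value on the way through.
        builder.<String, String>stream("INPUT.TOPIC")
               .mapValues(value -> value.toUpperCase())
               .to("OUTPUT.TOPIC");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```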

Monitoring Event Streams cluster health with Prometheus and Grafana

Set up Prometheus to monitor your Event Streams installations and visualize the data through Grafana.

Monitoring cluster health with Datadog

Monitor the health of your cluster by using Datadog to capture Kafka broker JMX metrics.

Monitoring cluster health with a Splunk HTTP Event Collector

Monitor the health of your cluster by using a Splunk HTTP Event Collector to capture Kafka broker JMX metrics.

Monitoring cluster health with Splunk

Monitor the health of your cluster by using Splunk to capture Kafka broker JMX metrics.

Setting up alert notifications to Slack

Receive notifications about the health of your cluster based on monitored metrics.

Installing a multizone cluster

See an example of setting up a multizone Event Streams installation in a non-zone-aware cluster.